On asymptotics of the beta-coalescents
We show that the total number of collisions in the exchangeable coalescent
process driven by the beta(1,b) measure converges in distribution to a
1-stable law, as the initial number of particles goes to infinity. The stable
limit law is also shown for the total branch length of the coalescent tree.
These results were known previously for the instance b=1, which corresponds
to the Bolthausen--Sznitman coalescent. The approach we take is based on
estimating the quality of a renewal approximation to the coalescent in terms of
a suitable Wasserstein distance. Application of the method to
beta(a,b)-coalescents with 0<a<1 leads to a simplified derivation of the known
(2-a)-stable limit. We furthermore derive asymptotic expansions for the
moments of the number of collisions and of the total branch length for the
beta(1,b)-coalescent by exploiting the method of sequential approximations.
Comment: 25 pages, submitted for publication
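For context on the rates involved, recall the standard Lambda-coalescent collision rates (a textbook definition, not taken from this abstract): when the process has b blocks, each fixed k-tuple of blocks merges at rate

```latex
\lambda_{b,k} \;=\; \int_0^1 x^{k-2}(1-x)^{b-k}\,\Lambda(\mathrm{d}x),
\qquad 2 \le k \le b,
```

where the beta-coalescents above take \Lambda to be a beta distribution; the Bolthausen--Sznitman coalescent is the special case in which \Lambda is the uniform distribution on [0,1].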
Visibly Linear Dynamic Logic
We introduce Visibly Linear Dynamic Logic (VLDL), which extends Linear
Temporal Logic (LTL) by temporal operators that are guarded by visibly pushdown
languages over finite words. In VLDL one can, e.g., express that a function
resets a variable to its original value after its execution, even in the
presence of an unbounded number of intermediate recursive calls. We prove that
VLDL describes exactly the ω-visibly pushdown languages. Thus it is
strictly more expressive than LTL and able to express recursive properties of
programs with unbounded call stacks.
The main technical contribution of this work is a translation of VLDL into
ω-visibly pushdown automata of exponential size via one-way alternating
jumping automata. This translation yields exponential-time algorithms for
satisfiability, validity, and model checking. We also show that visibly
pushdown games with VLDL winning conditions are solvable in triply-exponential
time. We prove all these problems to be complete for their respective
complexity classes.
Comment: 25 pages
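To illustrate the kind of guard languages involved, here is a minimal sketch of a recognizer for a visibly pushdown language: the alphabet is partitioned into calls, returns, and internal symbols, and the input symbol alone dictates the stack action. The alphabet and the "well-matched" property are illustrative, not taken from the paper.

```python
def is_well_matched(word, calls={"c"}, returns={"r"}, internals={"i"}):
    """Recognize the visibly pushdown language of well-matched words:
    every return matches a preceding call. The input symbol alone
    determines the stack action (push on calls, pop on returns)."""
    stack = []
    for s in word:
        if s in calls:
            stack.append(s)          # call symbol: always push
        elif s in returns:
            if not stack:
                return False         # unmatched return
            stack.pop()              # return symbol: always pop
        elif s not in internals:
            raise ValueError(f"symbol {s!r} not in the partitioned alphabet")
    return not stack                 # accept iff every call was matched
```

Because the stack behavior is visible in the input, such languages stay closed under Boolean operations, which is what makes them usable as temporal-operator guards.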
Isospectral Alexandrov Spaces
We construct the first non-trivial examples of compact non-isometric
Alexandrov spaces which are isospectral with respect to the Laplacian and not
isometric to Riemannian orbifolds. This construction generalizes independent
earlier results by the authors based on Schueth's version of the torus method.
Comment: 15 pages, no figures; minor clarification
Sampling from Stochastic Finite Automata with Applications to CTC Decoding
Stochastic finite automata arise naturally in many language and speech
processing tasks. They include stochastic acceptors, which represent certain
probability distributions over random strings. We consider the problem of
efficient sampling: drawing random string variates from the probability
distribution represented by stochastic automata and transformations thereof.
We show that path-sampling is effective and can be efficient if the
epsilon-graph of a finite automaton is acyclic. We provide an algorithm that
ensures this by conflating epsilon-cycles within strongly connected components.
Sampling is also effective in the presence of non-injective transformations of
strings. We illustrate this in the context of decoding for Connectionist
Temporal Classification (CTC), where the predictive probabilities yield
auxiliary sequences which are transformed into shorter labeling strings. We can
sample efficiently from the transformed labeling distribution and use this in
two different strategies for finding the most probable CTC labeling.
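The CTC transformation mentioned above is the standard collapse map (merge repeated symbols, then drop blanks). A minimal sketch of that map, together with naive sampling of frame-wise paths and a Monte Carlo mode estimate of the labeling distribution, follows; the function names and the sampling strategy are illustrative, not the paper's algorithm.

```python
import random
from collections import Counter

BLANK = "-"

def collapse(path):
    """CTC labeling function: merge repeated symbols, then drop blanks."""
    out, prev = [], None
    for s in path:
        if s != prev and s != BLANK:
            out.append(s)
        prev = s
    return "".join(out)

def sample_labeling(frame_probs, rng):
    """Draw one frame-wise path from per-frame distributions, then collapse it."""
    path = [rng.choices(list(p), weights=list(p.values()))[0] for p in frame_probs]
    return collapse(path)

def most_probable_labeling(frame_probs, n_samples=1000, seed=0):
    """Monte Carlo estimate of the mode of the transformed labeling distribution."""
    rng = random.Random(seed)
    counts = Counter(sample_labeling(frame_probs, rng) for _ in range(n_samples))
    return counts.most_common(1)[0][0]
```

Note that distinct paths (e.g. "aa-", "-aa", "a--") collapse to the same labeling, which is why sampling in label space, rather than path space, is attractive.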
Robust 1-Bit Compressed Sensing via Hinge Loss Minimization
This work theoretically studies the problem of estimating a structured
high-dimensional signal from noisy 1-bit Gaussian
measurements. Our recovery approach is based on a simple convex program which
uses the hinge loss function as data fidelity term. While such a risk
minimization strategy is very natural to learn binary output models, such as in
classification, its capacity to estimate a specific signal vector is largely
unexplored. A major difficulty is that the hinge loss is just piecewise linear,
so that its "curvature energy" is concentrated in a single point. This is
substantially different from other popular loss functions considered in signal
estimation, e.g., the square or logistic loss, which are at least locally
strongly convex. It is therefore somewhat unexpected that we can still prove
very similar types of recovery guarantees for the hinge loss estimator, even in
the presence of strong noise. More specifically, our non-asymptotic error
bounds show that stable and robust reconstruction of the signal can be achieved with
the optimal oversampling rate in terms of the number of measurements m.
Moreover, we permit a wide class of structural assumptions on
the ground truth signal, in the sense that it can belong to an arbitrary
bounded convex set. The proofs of our main results
rely on some recent advances in statistical learning theory due to Mendelson.
In particular, we invoke an adapted version of Mendelson's small ball method
that allows us to establish a quadratic lower bound on the error of the first
order Taylor approximation of the empirical hinge loss function
…
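The kind of convex program described above can be sketched as follows: minimize the empirical hinge loss over a bounded convex set (here simply a Euclidean ball) via projected subgradient descent, given 1-bit measurements y = sign(Ax). This is an illustrative stand-in, not the paper's exact program or analysis; the step size, ball radius, and toy data are assumptions.

```python
import numpy as np

def hinge_loss_recover(A, y, radius=1.0, steps=500, lr=0.05):
    """Projected subgradient descent on the empirical hinge loss
    L(x) = mean_i max(0, 1 - y_i * <a_i, x>) over the ball ||x|| <= radius."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(steps):
        margins = y * (A @ x)
        active = margins < 1.0                   # where the hinge is "on"
        grad = -(A * y[:, None])[active].sum(axis=0) / m
        x -= lr * grad
        nrm = np.linalg.norm(x)
        if nrm > radius:                         # project back onto the ball
            x *= radius / nrm
    return x

# toy usage: 1-bit Gaussian measurements of a unit-norm signal
rng = np.random.default_rng(0)
n, m = 20, 400
x_true = np.zeros(n); x_true[:3] = 1.0; x_true /= np.linalg.norm(x_true)
A = rng.standard_normal((m, n))
y = np.sign(A @ x_true)
x_hat = hinge_loss_recover(A, y)
corr = x_hat @ x_true / (np.linalg.norm(x_hat) * np.linalg.norm(x_true))
```

Since 1-bit measurements destroy scale information, only the direction of the signal is recoverable, which is why the quality measure above is the normalized correlation.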